Subgradient-based Neural Network for Nonconvex Optimization Problems in Support Vector Machines with Indefinite Kernels
Authors
Abstract
Support vector machines (SVMs) with positive semidefinite kernels yield convex quadratic programming problems, whereas SVMs with indefinite kernels yield nonconvex quadratic programming problems. Most optimization methods for SVMs rely on the convexity of the objective function and are therefore inefficient for such nonconvex problems. In this paper, we propose a subgradient-based neural network (SGNN) for the problems cast by SVMs with indefinite kernels. We show that the state trajectory of the proposed neural network has finite length and, as a consequence, converges to a singleton. We also prove that the solution of SGNN starting from a given initial value coincides with the slow solution. Moreover, we employ the Łojasiewicz inequality to characterize the convergence rate of the trajectories of SGNN: through a quantitative evaluation of the Łojasiewicz exponent at the equilibrium points, each trajectory is shown to converge, either exponentially or in finite time, to a singleton belonging to the set of constrained critical points. The method is easy to implement and introduces no new parameters. Three benchmark data sets from the University of California, Irvine (UCI) machine learning repository are used in the numerical tests, and the experimental results demonstrate the efficiency of the proposed neural network.
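The paper's precise SGNN dynamics are given in the full text; as a rough illustration of the setting, the sketch below integrates a projected (sub)gradient flow for the SVM dual with an indefinite kernel matrix. The names (`sgnn_flow`, `project_box`), the forward-Euler discretization, and the omission of the equality constraint y^T alpha = 0 are assumptions made here for brevity, not the authors' formulation.

```python
import numpy as np

def svm_dual_grad(alpha, Q):
    # Gradient of f(alpha) = 0.5 * alpha^T Q alpha - 1^T alpha
    return Q @ alpha - 1.0

def project_box(v, C):
    # Projection onto the box constraints 0 <= alpha_i <= C
    return np.clip(v, 0.0, C)

def sgnn_flow(K, y, C=1.0, step=1e-3, iters=20000):
    """Forward-Euler integration of the projected gradient flow
        d(alpha)/dt = P_[0,C](alpha - grad f(alpha)) - alpha,
    whose equilibria are constrained critical points of the
    (possibly nonconvex) SVM dual. The equality constraint
    y^T alpha = 0 is omitted in this sketch."""
    Q = (y[:, None] * y[None, :]) * K  # Q = (y y^T) o K; indefinite if K is
    alpha = np.zeros(len(y))
    for _ in range(iters):
        alpha += step * (project_box(alpha - svm_dual_grad(alpha, Q), C) - alpha)
    return alpha
```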
Similar papers
An Efficient Neurodynamic Scheme for Solving a Class of Nonconvex Nonlinear Optimization Problems
Under a p-power (or partial p-power) transformation, the Lagrangian function of a nonconvex optimization problem becomes locally convex. In this paper, we present a neural network based on an NCP function for solving the nonconvex optimization problem. An important feature of this neural network is the one-to-one correspondence between its equilibria and the KKT points of the nonconvex optimization...
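As background for the NCP-function approach mentioned above, here is a minimal sketch of the Fischer-Burmeister function, a standard NCP function; whether this particular choice matches the paper's is an assumption.

```python
import numpy as np

def fischer_burmeister(a, b):
    """Fischer-Burmeister NCP function:
        phi(a, b) = sqrt(a^2 + b^2) - a - b.
    phi(a, b) = 0  iff  a >= 0, b >= 0 and a*b = 0, so KKT
    complementarity conditions can be recast as root-finding,
    which a neurodynamic model can then target."""
    return np.sqrt(a**2 + b**2) - a - b

print(fischer_burmeister(0.0, 2.0))  # 0.0   (complementarity holds)
print(fischer_burmeister(1.0, 1.0))  # ~-0.586 (violated: both positive)
```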
Approximate Stochastic Subgradient Estimation Training for Support Vector Machines
Subgradient algorithms for training support vector machines have been quite successful for solving large-scale and online learning problems. However, they have been restricted to linear kernels and strongly convex formulations. This paper describes efficient subgradient approaches without such limitations. Our approaches make use of randomized low-dimensional approximations to nonlinear kernels,...
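One standard instance of a randomized low-dimensional kernel approximation is random Fourier features for the Gaussian kernel, paired here with a Pegasos-style stochastic subgradient step on the hinge loss. This pairing is an illustration of the idea, not necessarily the paper's exact construction; labels are assumed to be in {-1, +1}.

```python
import numpy as np

def rff_map(X, W, b):
    """Random Fourier features: z(x) = sqrt(2/D) * cos(W x + b)
    approximates the Gaussian kernel k(x, x') = exp(-gamma ||x - x'||^2)
    via k(x, x') ~ z(x) . z(x')."""
    D = W.shape[0]
    return np.sqrt(2.0 / D) * np.cos(X @ W.T + b)

def pegasos_rff(X, y, gamma=0.5, D=200, lam=1e-3, epochs=5, seed=0):
    """Stochastic subgradient (Pegasos-style) SVM training in the
    low-dimensional random-feature space instead of the kernel space."""
    rng = np.random.default_rng(seed)
    W = rng.normal(scale=np.sqrt(2.0 * gamma), size=(D, X.shape[1]))
    b = rng.uniform(0.0, 2.0 * np.pi, size=D)
    Z = rff_map(X, W, b)
    w, t = np.zeros(D), 0
    for _ in range(epochs):
        for i in rng.permutation(len(y)):
            t += 1
            eta = 1.0 / (lam * t)  # Pegasos step size
            # Subgradient of (lam/2)||w||^2 + max(0, 1 - y_i * w.z_i)
            g = lam * w
            if y[i] * (w @ Z[i]) < 1.0:
                g -= y[i] * Z[i]
            w -= eta * g
    return w, (W, b)  # weights plus the feature-map parameters
```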
Subspace Learning in Krein Spaces: Complete Kernel Fisher Discriminant Analysis with Indefinite Kernels
Positive definite kernels, such as Gaussian Radial Basis Functions (GRBF), have been widely used in computer vision for designing feature extraction and classification algorithms. In many cases, non-positive definite (npd) kernels and non-metric similarity/dissimilarity measures arise naturally (e.g., the Hausdorff distance, Kullback-Leibler divergences, and Compact Support (CS) kernels). Hence, there...
Support vector machines with indefinite kernels
Training support vector machines (SVMs) with indefinite kernels has recently attracted attention in the machine learning community. This is partly due to the fact that many similarity functions arising in practice are not symmetric positive semidefinite, i.e., the Mercer condition is not satisfied or is difficult to verify. Previous work on training SVMs with indefinite kernels...
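A common family of remedies in this line of prior work replaces the indefinite kernel matrix with a nearby positive semidefinite one by modifying its spectrum. Below is a minimal sketch of the standard eigenvalue clip and flip transformations; these are techniques from the literature, not necessarily this paper's proposal.

```python
import numpy as np

def make_psd(K, mode="clip"):
    """Turn an indefinite symmetric kernel matrix into a PSD one
    by modifying its eigenvalues (spectrum clip / flip)."""
    vals, vecs = np.linalg.eigh(K)     # symmetric K => real eigendecomposition
    if mode == "clip":
        vals = np.maximum(vals, 0.0)   # zero out negative eigenvalues
    elif mode == "flip":
        vals = np.abs(vals)            # reflect negative eigenvalues
    return (vecs * vals) @ vecs.T      # reassemble V diag(vals) V^T
```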
Learning SVM Classifiers with Indefinite Kernels
Recently, training support vector machines with indefinite kernels has attracted great attention in the machine learning community. In this paper, we tackle this problem by formulating a joint optimization model over SVM classification and kernel principal component analysis. We first reformulate kernel principal component analysis as a general kernel transformation framework, and then inc...
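For reference, here is a minimal sketch of standard kernel principal component analysis on a precomputed kernel matrix; how the paper folds this into its joint optimization model is not reproduced here, and the function name is hypothetical.

```python
import numpy as np

def kernel_pca_transform(K, n_components=2):
    """Project data described by kernel matrix K onto its top
    principal components in feature space (standard kernel PCA)."""
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n      # centering matrix
    Kc = H @ K @ H                           # center the kernel matrix
    vals, vecs = np.linalg.eigh(Kc)          # ascending eigenvalues
    idx = np.argsort(vals)[::-1][:n_components]
    vals, vecs = vals[idx], vecs[:, idx]
    # Component scores: eigenvectors scaled by sqrt(eigenvalue)
    return vecs * np.sqrt(np.maximum(vals, 0.0))
```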